Efficient estimation of multidimensional regression model using multilayer perceptrons

Author

  • Joseph Rynkiewicz

Abstract

This work concerns the estimation of multidimensional nonlinear regression models using multilayer perceptrons (MLPs). The main problem with such models is that we need to know the covariance matrix of the noise to get an optimal estimator. However, we show in this paper that if we choose as the cost function the logarithm of the determinant of the empirical error covariance matrix, then we get an asymptotically optimal estimator. Moreover, under suitable assumptions, we show that this cost function leads to a very simple asymptotic law for testing the number of parameters of an identifiable MLP. Numerical experiments confirm the theoretical results.

Keywords: nonlinear regression, multivariate regression, multilayer perceptrons, asymptotic normality
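The cost function at the heart of the paper is the log-determinant of the empirical covariance matrix of the model's residuals. As a minimal sketch (not the paper's implementation), the function below computes that cost from an (n, d) array of residuals y_i - f(x_i; θ); the function name and the synthetic-data illustration are ours.

```python
import numpy as np

def log_det_cost(residuals):
    """Log-determinant of the empirical covariance matrix of the
    d-dimensional residuals, as in the paper's proposed cost.

    residuals: (n, d) array of errors y_i - f(x_i; theta).
    """
    n, d = residuals.shape
    # Empirical error covariance matrix (d x d)
    sigma = residuals.T @ residuals / n
    # slogdet is numerically safer than log(det(sigma))
    sign, logdet = np.linalg.slogdet(sigma)
    return logdet

# Illustration on synthetic residuals: for independent unit-variance
# noise the empirical covariance is close to the identity matrix, so
# the cost is close to log det(I) = 0.
rng = np.random.default_rng(0)
eps = rng.standard_normal((10_000, 3))
print(log_det_cost(eps))
```

Minimizing this scalar over the MLP's parameters removes the need to know the noise covariance in advance, which is the practical appeal of the criterion.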


Similar articles

Efficient estimation of multidimensional regression model with multilayer perceptron

Abstract. This work concerns the estimation of multidimensional nonlinear regression models using a multilayer perceptron (MLP). The main problem with such a model is that we have to know the covariance matrix of the noise to get an optimal estimator. However, we show that, if we choose as the cost function the logarithm of the determinant of the empirical error covariance matrix, we get an asymptotically opti...


Consistent estimation of the architecture of multilayer perceptrons

We consider regression models involving multilayer perceptrons (MLPs) with one hidden layer and Gaussian noise. The parameters of the MLP can be estimated by maximizing the likelihood of the model. In this framework, it is difficult to determine the true number of hidden units using an information criterion, such as the Bayesian information criterion (BIC), because the information mat...


Tensor Regression Networks with various Low-Rank Tensor Approximations

Tensor regression networks achieve a high rate of compression of model parameters in multilayer perceptrons (MLPs) while having only a slight impact on performance. The tensor regression layer imposes low-rank constraints and replaces the flattening operation of a traditional MLP. We investigate tensor regression networks using various low-rank tensor approximations, aiming to...


Efficient training of multilayer perceptrons using principal component analysis.

A training algorithm for multilayer perceptrons is discussed and studied in detail, which relates to the technique of principal component analysis. The latter is performed with respect to a correlation matrix computed from the example inputs and their target outputs. Typical properties of the training procedure are investigated by means of a statistical physics analysis in models of learning re...


Feature-Enriched Character-Level Convolutions for Text Regression

We present a new model for text regression that seamlessly combines engineered features and character-level information through deep parallel convolution stacks, multilayer perceptrons, and multitask learning. We use these models to create the SHEF/CNN systems for the sentence-level Quality Estimation task of WMT 2017 and the Emotion Intensity Analysis task of WASSA 2017. Our experiments reveal that...



Journal:
  • Neurocomputing

Volume 69, Issue 

Pages  -

Publication year: 2006